A superlinearly convergent R-regularized Newton scheme for variational models with concave sparsity-promoting priors
Authors
Abstract
A general class of variational models with concave priors is considered for obtaining certain sparse solutions, for which nonsmoothness and non-Lipschitz continuity of the objective functions pose significant challenges from an analytical as well as numerical point of view. For computing a stationary point of the underlying variational problem, a Newton-type scheme with provable convergence properties is proposed. The possible non-positive definiteness of the generalized Hessian is handled by a tailored regularization technique, which is motivated by reweighting as well as the classical trust-region method. Our numerical experiments demonstrate selected applications in image processing, support vector machines, and optimal control of partial differential equations.
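The abstract describes a Newton-type iteration in which possible indefiniteness of the generalized Hessian is handled by a regularization motivated by reweighting and trust regions. The sketch below is not the authors' R-regularized scheme; it only illustrates that general idea for a smoothed l_p prior with 0 < p < 1, combining iteratively reweighted quadratic modeling of the concave prior with a diagonal shift mu*I that keeps the Newton system positive definite. The function name regularized_newton and the parameter values (lam, p, eps, mu) are illustrative assumptions, not taken from the paper.

import numpy as np

def regularized_newton(A, b, lam=0.1, p=0.5, eps=1e-6, mu=1e-3,
                       max_iter=200, tol=1e-8):
    # Illustrative sketch: minimize 0.5*||Ax - b||^2 + lam * sum((x_i^2 + eps)^(p/2)),
    # a smoothed version of a concave (l_p, 0 < p < 1) sparsity-promoting prior.
    AtA, Atb = A.T @ A, A.T @ b
    # Start from the minimum-norm least-squares solution (a common choice).
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(max_iter):
        # IRLS-style weights: lam*diag(w) is the Hessian of a convex quadratic
        # majorizer of the smoothed prior at the current iterate.
        w = p * (x**2 + eps) ** (p / 2 - 1.0)
        grad = AtA @ x - Atb + lam * w * x       # gradient of the smoothed objective
        H = AtA + np.diag(lam * w + mu)          # positive-definite model Hessian (mu*I shift)
        step = np.linalg.solve(H, grad)
        x = x - step
        if np.linalg.norm(step) <= tol * max(1.0, np.linalg.norm(x)):
            break
    return x

# Tiny usage example on a synthetic sparse-recovery problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = regularized_newton(A, b)
print("entries above 1e-2 in magnitude:", int(np.sum(np.abs(x_hat) > 1e-2)))

Replacing the possibly indefinite prior curvature by the reweighted quadratic model and adding mu*I is only one way to obtain a positive-definite Newton system; the paper's tailored R-regularization and its superlinear convergence analysis differ in the details.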
Similar resources
Sufficient Optimality Conditions and Semi-Smooth Newton Methods for Optimal Control of Stationary Variational Inequalities
In this paper sufficient second order optimality conditions for optimal control problems subject to stationary variational inequalities of obstacle type are derived. Since optimality conditions for such problems always involve measures as Lagrange multipliers, which impede the use of efficient Newton type methods, a family of regularized problems is introduced. Second order sufficient optimalit...
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
A Class of Globally Convergent Algorithms for Pseudomonotone Variational Inequalities
We describe a fairly broad class of algorithms for solving variational inequalities, global convergence of which is based on the strategy of generating a hyperplane separating the current iterate from the solution set. The methods are shown to converge under very mild assumptions. Specifically, the problem mapping is only assumed to be continuous and pseudomonotone with respect to at least one ...
A Superlinearly Convergent Smoothing Newton Continuation Algorithm for Variational Inequalities over Definable Sets
In this paper, we use the concept of barrier-based smoothing approximations introduced by Chua and Li [SIAM J. Optim., 23 (2013), pp. 745–769] to extend the smoothing Newton continuation algorithm of Hayashi, Yamashita, and Fukushima [SIAM J. Optim., 15 (2005), pp. 593– 615] to variational inequalities over general closed convex sets X. We prove that when the underlying barrier has a gradient m...
Journal: Comp. Opt. and Appl.
Volume: 57, Issue: -
Pages: -
Publication year: 2014